Entropy and thinning of discrete random variables

Author

  • Oliver Johnson
Abstract

We describe five types of results concerning information and concentration of discrete random variables, and relationships between them. These results are motivated by their counterparts in the continuous case. The results we consider are information-theoretic approaches to Poisson approximation, the maximum entropy property of the Poisson distribution, discrete concentration (Poincaré and logarithmic Sobolev) inequalities, monotonicity of entropy, and concavity of entropy in the Shepp–Olkin regime.

1 Results in continuous case

In this paper we give a personal review of a number of results concerning the entropy and concentration properties of discrete random variables. For simplicity, we only consider independent sets of random variables (though it is an extremely interesting open problem to extend many of the results to the dependent case). These results are generally motivated by their counterparts in the continuous case, which we briefly review, using notation which holds only for Section 1. For simplicity we restrict our attention in this section to random variables taking values in R. For any probability density p, write λp = ∫∞−∞ x p(x) dx for its mean and Varp = ∫∞−∞ (x − λp)² p(x) dx for its variance. We write h(p) for the differential entropy of p, and interchangeably write h(X) for X ∼ p. Similarly we write D(p‖q) or D(X‖Y) for relative entropy. We write φμ,σ²(x) for the density of the Gaussian Zμ,σ² ∼ N(μ, σ²). Given a function f, we wish to measure its concentration properties with respect to a probability density p; we write λp(f) = ∫∞−∞ f(x) p(x) dx for the expectation of f with respect to p, write Varp(f) = ∫∞−∞ p(x)(f(x) − λp(f))² dx for the variance, and define...
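As an illustrative sketch (my own, not taken from the paper), Bernoulli thinning Tα deletes each of the "points" counted by a discrete variable independently with probability 1 − α; a standard fact in the thinning literature is that thinning a Poisson(λ) variable yields a Poisson(αλ) variable. A minimal simulation using only the Python standard library:

```python
import math
import random

def poisson(lam, rng):
    # Draw one Poisson(lam) sample via Knuth's multiplication method
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def thin(x, alpha, rng):
    # Bernoulli thinning: keep each of the x points independently w.p. alpha
    return sum(1 for _ in range(x) if rng.random() < alpha)

rng = random.Random(0)
lam, alpha = 4.0, 0.3
samples = [thin(poisson(lam, rng), alpha, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
# Thinning Poisson(lam) gives Poisson(alpha * lam), so mean should be near 1.2
```

The empirical mean of the thinned samples concentrates around αλ = 1.2, consistent with the Poisson-to-Poisson behaviour of the thinning map.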


Similar articles

ENTROPY FOR DTMC SIS EPIDEMIC MODEL

In this paper, first, a history of mathematical models is given. Next, some basic information about random variables, stochastic processes and Markov chains is introduced. Then, the entropy for a discrete-time Markov process is discussed. After that, the entropy for SIS stochastic models is computed, and it is proved that an epidemic will disappear after a long time.


Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain

In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is a finite-state-space, homogeneous, stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...


On entropy for mixtures of discrete and continuous variables

Let X be a discrete random variable with support S and let f : S → S′ be a bijection. It is well-known that the entropy of X is the same as the entropy of f(X). This entropy preservation property has been well-utilized to establish non-trivial properties of discrete stochastic processes, e.g. the queuing process [1]. Entropy, as well as entropy preservation, is well-defined only in the context of pur...
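A quick numerical check of this preservation property (a sketch of my own, not from the cited work): relabelling outcomes by any injective map leaves the entropy unchanged, since entropy depends only on the multiset of probabilities.

```python
from math import log2

def entropy(p):
    # Shannon entropy in bits of a distribution given as {outcome: probability}
    return -sum(q * log2(q) for q in p.values() if q > 0)

p = {0: 0.5, 1: 0.25, 2: 0.25}
f = lambda x: x + 7                      # any injective relabelling of the support
fp = {f(x): q for x, q in p.items()}     # distribution of f(X)
# entropy(p) == entropy(fp): relabelling only permutes the probabilities
```

Here H(X) = 0.5·1 + 0.25·2 + 0.25·2 = 1.5 bits, and H(f(X)) is identical.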


Log-concavity and the maximum entropy property of the Poisson distribution

We prove that the Poisson distribution maximises entropy in the class of ultra-log-concave distributions, extending a result of Harremoës. The proof uses ideas concerning log-concavity, and a semigroup action involving adding Poisson variables and thinning. We go on to show that the entropy is a concave function along this semigroup.

1 Maximum entropy distributions It is well-known that the dist...


A Novel Method for Increasing the Entropy of a Sequence of Independent, Discrete Random Variables

In this paper, we propose a novel method for increasing the entropy of a sequence of independent, discrete random variables with arbitrary distributions. The method uses an auxiliary table and a novel theorem that concerns the entropy of a sequence in which the elements are a bitwise exclusive-or sum of independent discrete random variables.
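The entropy gain from combining independent variables by exclusive-or can be seen in a small sketch (my illustration, not the paper's auxiliary-table method): for independent X and Y, H(X ⊕ Y) ≥ H(X ⊕ Y | Y) = H(X), so the XOR sum is at least as entropic as either summand.

```python
from itertools import product
from math import log2

def entropy(p):
    # Shannon entropy in bits of a distribution given as {value: probability}
    return -sum(q * log2(q) for q in p.values() if q > 0)

def xor_dist(p, q):
    # Distribution of X xor Y for independent X ~ p, Y ~ q
    out = {}
    for (x, px), (y, py) in product(p.items(), q.items()):
        out[x ^ y] = out.get(x ^ y, 0.0) + px * py
    return out

p = {0: 0.9, 1: 0.1}   # heavily biased bit, H(X) ~ 0.47 bits
q = {0: 0.6, 1: 0.4}   # mildly biased bit, H(Y) ~ 0.97 bits
h_xor = entropy(xor_dist(p, q))
# h_xor >= max(H(X), H(Y)) for independent X, Y
```

With these numbers the XOR bit has P(0) = 0.58, so its entropy exceeds both components' entropies, as the inequality predicts.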



Journal:
  • CoRR

Volume: abs/1510.05390  Issue: -

Pages: -

Publication date: 2015